Superforecasting: The Art and Science of Prediction

  • Downloads: 8551
  • Type: EPUB+TXT+PDF+MOBI
  • Create Date: 2021-04-01 14:53:36
  • Update Date: 2025-09-06
  • Status: finished
  • Author: Philip E. Tetlock
  • ISBN: 1847947158
  • Environment: PC/Android/iPhone/iPad/Kindle

Summary

The international bestseller

'A manual for thinking clearly in an uncertain world. Read it.' Daniel Kahneman, author of Thinking, Fast and Slow
_________________________
What if we could improve our ability to predict the future?

Everything we do involves forecasts about how the future will unfold. Whether buying a new house or changing jobs, designing a new product or getting married, our decisions are governed by implicit predictions of how things are likely to turn out. The problem is, we're not very good at it.

In a landmark, twenty-year study, Wharton professor Philip Tetlock showed that the average expert was only slightly better at predicting the future than a layperson using random guesswork. Tetlock's latest project – an unprecedented, government-funded forecasting tournament involving over a million individual predictions – has since shown that there are, however, some people with real, demonstrable foresight. These are ordinary people, from former ballroom dancers to retired computer programmers, who have an extraordinary ability to predict the future with a degree of accuracy 60% greater than average. They are superforecasters.

In Superforecasting, Tetlock and his co-author Dan Gardner offer a fascinating insight into what we can learn from this elite group. They show the methods used by these superforecasters which enable them to outperform even professional intelligence analysts with access to classified data. And they offer practical advice on how we can all use these methods for our own benefit – whether in business, in international affairs, or in everyday life.
_________________________
'The techniques and habits of mind set out in this book are a gift to anyone who has to think about what the future might bring. In other words, to everyone.' Economist

'A terrific piece of work that deserves to be widely read ... Highly recommended.' Independent

'The best thing I have read on predictions ... Superforecasting is an indispensable guide to this indispensable activity.' The Times

Reviews

Walter Cavinaw

This has become a must-read, and although I've put it off for a while I am glad I got around to it eventually. We all make forecasts. It's unavoidable. In almost all decisions we make there are implicit and explicit beliefs about what will or won't happen in the world. This book tells us how we can forecast better. Unfortunately it also has a lot of overlap with other recent content on decision-making.

The author wrote his first book about the forecasting performance of pundits, which showed that they were very good at creating high-confidence narratives, but not good at predicting outcomes. The second book is the result of the Good Judgment Project, in which people were invited to forecast sociopolitical and economic events. Surprisingly, there was a group of forecasters (superforecasters) that did much better than other methods of forecasting, including intelligence analysts with classified information. Although there are limits to how far and how accurately we can predict things, the project demonstrates that it is possible to get (much) better at forecasting. The book describes how we can get better.

In order to improve we need feedback on how we are doing. To get feedback we need to measure our forecasting performance and understand what went wrong in our forecast by investigating our analysis. The first step to improving our analysis is to begin recording the thought process that goes into the analysis: the data we are looking at, the models we are using, and the conclusions we came to. Only by elucidating the analysis can we look back and discover our error. One common obstacle is the use of vague language (maybe, probably, etc.) because it obfuscates what we mean and how well we predicted things. Being precise, for example using probabilities in estimating outcomes, can help us determine our performance, which can then help us improve.

Forecasts can be improved by structuring the analysis better. Break the problem down into smaller problems that can be more easily answered and build the answer back up, similar to how Fermi questions should be answered. You should also start with an outside view based on a generalization. This can often be the frequency of occurrence of a comparable set of instances, but can also be a simple model such as mean reversion or 'nothing changes'. After that, start to incorporate the details of your particular question and how it differs from the generalization.

Keep up to date on new information, in particular information related to the variables that are critical to your (mental) model, or information that tells you your model is misconstructed. Staying open-minded is important when looking at new information. Don't let your personal desires get in the way of your analysis. You have to find the middle ground between overreacting to noise and under-reacting to information that creates a significant change in outcome or that demonstrates your model is wrong.

Forecasters working in groups tended to outperform those working alone. When working in groups, try to avoid groupthink, which happens when a common narrative is shared by the group and negatively impacts the quality of analysis. Challenging each other's assumptions and analysis is necessary, but only when it does not erode trust. Trust is important within the group in order to encourage sharing and constructive criticism for the good of the analytical outcome. Asking for clarifications on specific issues is better than loaded questions that can hurt egos.

At the end of the book, the author summarises the characteristics that are shared by superforecasters. They:
- see the world as stochastic, where many things are possible
- are open-minded and willing to challenge their beliefs
- are curious, analytical and like to solve puzzles
- have above-average intelligence
- are introspective and reflective
- value a diverse set of views
- thoughtfully update beliefs
- are aware of their cognitive and emotional biases
- have humility in knowing what they do and don't know
- have a growth mindset and look to improve themselves
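The measurement step this review describes (record explicit probabilities, then score them once outcomes are known) is usually done with the Brier score, the metric Tetlock's tournaments used. A minimal sketch in Python; the function name and the sample numbers are mine:

```python
def brier_score(forecasts):
    """Mean squared difference between stated probabilities and outcomes.

    Each item is (probability assigned to the event, outcome as 0 or 1).
    Uses the two-sided convention from Tetlock's tournaments: 0.0 is a
    perfect score, always saying 50% earns 0.5, and 2.0 is the worst.
    """
    return sum((p - o) ** 2 + ((1 - p) - (1 - o)) ** 2
               for p, o in forecasts) / len(forecasts)

# Hypothetical track record: 80% on three events that happened,
# 30% on one that did not.
records = [(0.8, 1), (0.8, 1), (0.8, 1), (0.3, 0)]
print(round(brier_score(records), 3))  # → 0.105
```

Lower is better, and keeping such a log is exactly the "recording the thought process" step the review recommends: without explicit numbers there is nothing to score.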

Kurtis

As a statistics text it’s solid, as a statistician’s “u up?” text to Daniel Kahneman it’s a wild ride。

Al Hudecki

Really enjoyed the insights this book presented into deciphering opaque outcomes in the future. Simple steps to break down seemingly infinite outcomes into actionable predictions.

Jacob Mainwaring

I thought this was pretty good. He paints a clear picture of what separates good forecasters (humble, always learning, nuanced, willing to reevaluate their views) from people who are not. Like Nate Silver's book, it's a comparison of these foxes with hedgehogs, the ideological and close-minded pundits. I also like his idea of having forecasters' past predictions be rigorously evaluated rather than blindly trusted. The main downside is that there's a fair amount of overlapping information with Nate Silver's book, which came out a few years prior, so a lot of what was in here was not new for me.

Devika

I really enjoyed it. Makes you rethink how you think!

Jordan Berman

Like so many pop-social science books, there is maybe an essay's worth of insight here, heavily padded to fill a book-length manuscript. The authors spend too much time justifying that super-forecasters out-perform markets out-perform forecasters out-perform so-called "experts." Anyone who picks up the book is likely to believe this already, and the evidence is overwhelming.

What I wanted, instead, was an explanation of how to be a better forecaster, and then how to apply forecasting methods in a wider variety of domains. The former can be found but you have to cut through the chaff; the latter, not so much.

Still, a book worthy of at least a quick read for anyone interested in decision sciences and group dynamics.

Rob Lloyd-jones

An excellent book looking at the characteristics of what makes a good forecast/forecaster in an unpredictable world. Would recommend also reading Thinking, Fast and Slow by Kahneman, as there are many references to that work. Taleb's Black Swan is also worth a read as it offers a good critique of this book. Overall, it's nice to take some practical pointers to implement in your own decision making.

Hanna Violet Schwank

Fascinating insights into how predictions are made and why the U.S. intelligence community could really benefit from evaluating the true effectiveness of their forecasts. I think the premise and examples were great and it was well-written, but could have been done more concisely. Though I'm not really complaining—I enjoyed the read!

Xin

I think I have read too many of these scientific prediction books by now; they are starting to repeat each other. The information is still solid, but I am definitely experiencing some hindsight bias. The Good Judgment Project is interesting to learn about. The writing isn't very engaging, and gets really dense even around the people narratives, which suggests a verbose writer to me. I feel like the same info could be conveyed in a much shorter book.

Eric Thim

I read this because 76ers GM Daryl Morey recommends it, and it is excellent.

It's a book about how we make predictions (some big, some very small) every day and what we can do to make our predictions as accurate as possible. It summarizes a multi-year prediction study/competition and, using the resulting data combined with participant interviews, attempts to explain the traits and behaviors that make "superforecasters" more accurate than the average guy on the street. It provides a blueprint for how we can make better predictions, and I think there are a lot of great actionable recommendations most readers can take away that will improve their daily thought processes.

A+, go Sixers.

Soni S

I don't delve much into non-fiction but enjoyed this one. Forecasting feels unknown; when I hear these forecasts, I always wonder what the thought process is, and this book sheds some light on that. If you're interested in the psychology behind decision making, then this one is for you!

John Beshir

If you've read Thinking Fast and Slow and similar books there might be a lot here that isn't new to you early on, and it repeats a little too much to be a good refresher, but all in all it's a good summary of a lot of ways to improve your estimation: breaking things down with Fermi-ization, starting with a base rate, regular updating, and so on.

If you're already the sort of person who engages with Metaculus or PredictionBook, you might find it's mostly outlining things you know, but it could be a way to make sure you didn't miss anything basic, or to read about steps you don't put much time into; the discussion on regularly updating your estimates in the right way was fairly new to me because I've never bothered with this.

If you're new, this book takes things slowly enough and doesn't assume any existing knowledge, giving a basic overview of all the ideas it depends on, so I think it'd be a good introduction to forecasting in a general context.

Freddy Thong

Useful and eye-opening book. Has inspired me to read more of the like.

Kurt

Really good.

Georges DeChausay

Great for predicting future events with current statistics.

Matthew Taaffe

Superforecasting is a curious read; it delves into the characteristics of a rarefied group of individuals that Philip Tetlock has identified and nurtured in his professional work. But it also seems to serve as a defence and advocacy of forecasting in general, and of the forecasting tournaments Tetlock has set up.

The Appendix at the end is a highlight reel of the book's best bits: the ways in which the superforecasters think, and how they tackle tough questions. I also enjoyed the analysis of how pivotal decisions in history were made, the Cuban missile crisis and Bay of Pigs disaster being the most salient examples here.

Where it's less interesting is when it veers too much into the history of the forecasting tournaments, and the more abstract arguments for and against the ideas of forecasting.

All in all, the book is very light and easy to read; a lesser book could have easily been much drier. It's an interesting inspection of how humans think and how they can think better. I don't often read this type of nonfiction and I enjoyed it; if cerebral self-help books are your bag, I'd definitely give it a read.

Bryan Alkire

Not very good and disappointing. The idea was interesting, but the author made it boring. Additionally, the author spent most of his time describing his research, talking about what he does in the field and name-dropping everywhere. What other research he presents is nothing new, which any reader on the topic has probably already read. Lastly, his recommendations for becoming a superforecaster aren't new and are pretty self-evident. So, there's really nothing to recommend this book. It only gets a 2 because I found the topic interesting despite the author's attempt to make it not so interesting.

John

Tetlock has no doubt substantially advanced the necessary thought process to forecast, but the readability of this book is comparable to getting a coffee and bumping into an old, loquacious friend. You kind of just want to get to the end of the conversation.

I found the book could be summed up in just a few paragraphs. Some areas repeated themselves, and points were explained ad nauseam, overreaching for meaning. Name-dropping throughout the book alluded to conflict between certain experts that had no meaning to the reader. The subject matter is interesting on its own, but the book was written in such an informal, repetitive manner that it denigrates the reader and perhaps the author's contributions to society.

Alexander Aleksandrov

A good book to complement and remind about key takeaways from Daniel Kahneman.

Fredrik Lindholm

Some forecasters outperform others. These are the less dogmatic ones, more keen on challenging their own assumptions and taking in other perspectives.

Good forecasters focus on where estimations make sense: not 2050 sports winners, but closer, more knowable things. Updating beliefs and taking in outside views is important. Bayesian logic is the key. Using the Fermi method is good. He did this by separating what is known from the unknown. That is the first step.

"Using Anchoring and Adopting an Outside View: Because each situation is unique, you need to avoid rushing to judgment in a case. The best way to approach any question is to adopt an outside view. It means discovering the initial probability of an event. For example, imagine an Italian family living in a modest home in the United States. The father works as a librarian, and the mother has a part-time job in a daycare center. They live with their children and with their grandmother. If you were asked what the chances are that this Italian family owns a pet, you could try to answer the question by thinking about the details of their life situation. But if you do that, you can miss some important things! Rather than looking at the details first, you should start by researching the percentage of American households that own a pet. In a few seconds, thanks to Google, you will find that this figure is 62%. That is your outside view."

"The Characteristics of Good Forecasts: The predictions should be clear; it should be easy for any observer to agree or disagree with you."
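The outside-view recipe in the quoted passage (anchor on the base rate first, then adjust for case specifics) can be sketched as a tiny helper. This is my own illustration: the 62% base rate comes from the review, but the adjustment values and the clamping rule are invented:

```python
def outside_then_inside(base_rate, adjustments):
    """Anchor on the base rate (outside view), then shift the estimate
    for case-specific details (inside view), clamped to [0, 1]."""
    p = base_rate + sum(adjustments)
    return min(1.0, max(0.0, p))

# Base rate: 62% of American households own a pet (figure from the review).
# Hypothetical case details: children at home (+0.05),
# grandmother living in a modest home (-0.02).
p = outside_then_inside(0.62, [0.05, -0.02])
print(round(p, 2))  # → 0.65
```

Starting from the anchor keeps the case details from dominating: they only nudge the estimate rather than set it.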

Ryan

Too good!

Isaac Pritchett

About how to predict the future. Born of a 25-year study that showed that, outside of ~9 months, nobody could predict anything better than pure randomness. But within that window, if we think carefully and manage internal biases, we can do a lot better. Tells us how. Good read.

Jackson Curtis

This book was similar to listening to the winner of the elementary school science fair report his findings. He had an interesting research question, he checked all the boxes, but you come away feeling like he didn't actually contribute anything meaningful that's really going to push the field forward.

M.

This is effectively a long sales letter, a pitch to join Tetlock's Good Judgement Project. I was not sold, because the book didn't get close to answering the main question I had while reading: If I spend a(n effectively unpaid) year on this and am average or worse than the crowd in my predictions, will the time have been well spent?

The book is focused on the project's top performers, the 'superforecasters' of the title, so this doesn't come up. There is one person featured who is not labeled as a 'superforecaster,' but they are clearly above average and steadily improving. Not the average or worse performer I'm guessing I might be.

There are a couple of takeaways I found useful: 1) Ignore predictions that aren't precise enough to be verified as correct or incorrect at some point, and 2) ignore long-term predictions, those with a timeline longer than 5 years. I'll make an exception to 2) in cases where predictions depend on demographic trends, but these seem like a couple of good rules of thumb.

David

The topic of the book (which is about a large research project carried out by the author) is interesting. However, the book is written in the popular style of demonstrating various concepts and ideas via anecdotal stories, which is engaging but intellectually sparse. I felt I could have gotten all the intellectual content in a 30-minute summary (or by reading this post: https://aiimpacts.org/evidence-on-goo... ) rather than a 9-hour audiobook.

Dominic Bauer

Very interesting book. Really enjoyed it; although the author repeats himself a lot, it was a good reminder of the psychological pitfalls we run into on a continuous basis. It also made me consider things in a different light. All in all, definitely worth a read :D!

Ross Beck-macneil

(read as e-book) I really enjoyed this book. Does a good job of explaining how to make predictions in an uncertain world. Lays out the qualities and practices that are needed. Some key takeaways:
- Forecasts need to be precise, with probabilities attached to them
- This allows them to be evaluated and improved
- Should make Bayesian-style updates to forecasts as new information becomes available
- Should break down large problems into more manageable problems/questions (Fermi estimation)
- Better to have fine-tuned probabilities
- Be in perpetual beta, always looking to improve

While it is clear that better forecasts can be very useful in a lot of different environments, it isn't clear how this kind of thinking can be applied to everyday life. I can see how it can come up, but cultivating these skills might not be worth it for the majority of people.

Lakshmi

Tetlock is a unique combination of a great academic social scientist who can also write popular bestsellers, and this is clearly such a book. His research overlaps quite a bit with Daniel Kahneman's but is distinct enough to warrant a read even if you're familiar with that literature. Takeaways for me:

1. Scope your forecasting questions and add a probability to your forecast. So "will there be a pandemic" is a bad question, but "Will there be a pandemic that is caused by a virus from bats in the next decade?" is a very good question. Assigning probabilities to the answer to that question makes it even better.

2. Wisdom of the crowds is usually true as long as the crowd is composed of people who know something about the topic. So the average answer of the experts on a security council is a better answer than the answer of any one of those experts. But the average of random people who aren't security experts isn't. Seems obvious, but he has a nice story of how Obama's nat sec advisors had a wide range of probability estimates on the presence of Osama Bin Laden in the compound in Pakistan. That said, experts are quite bad at updating their beliefs (#3 below), which makes them wrong a lot.

3. Constantly update your beliefs: basically use Bayesian updates for everything in life, and don't be egotistical here. Keep a journal of your predictions and their accuracy if necessary.

There are a number of interesting historical anecdotes throughout the book that make it enjoyable to read. The book also tells the story of the Good Judgement Project (GJP) that Tetlock and his colleagues at Wharton run. Also, Tyler Cowen has a lovely interview with Tetlock, which was what motivated me to read the book in the first place: https://conversationswithtyler.com/ep...

Library of

Appreciated the book, a good addition to Thinking Fast and Slow for anyone interested in forecasting and/or decision making. Below are my notes. https://libraryof.xyz/portfolio/super...

Philip Tetlock has tested a large population's forecasting ability through tests performed over a long period of time. The results show that about 2% are what he calls "Superforecasters" (SF) – these people have a real, measurable ability to assess how high-stakes events are likely to develop three months, six months, a year, and a year and a half ahead. Foresight is a product of a way of thinking, gathering information and updating perceptions. This habit can be learned and developed. You can test yourself: www.goodjudgement.com

INTELLIGENT, BUT NOT GENIUSES. The population that took the test had a higher score on intelligence and knowledge tests than 70% of the total population. The SFs scored higher than 80% of the population, well above average, but most of them well below the so-called "genius territory" (often arbitrarily defined as the best 1%, or an IQ of 135+).

THINK PROBABILISTICALLY. The best forecasters, according to Philip Tetlock, are good with numbers but rarely use mathematical models or formulas. They think in terms of probability. They know that the difference between the good and the amateurs is that the former know the difference between a 60/40 bet and a 40/60 bet (in poker, investing and management).

PROBABILITY IS AN INTUITION THAT REQUIRES EXPERTISE. Bridge players can develop well-calibrated judgment, but research shows that what is calibrated in one context is rarely transferred well to another. To get better at a certain type of forecast, that specific one should be repeated with good feedback.

DEFECTIVE IMPULSES. System 1 is fast and always running in the background. If you get a question and you immediately know the answer, it comes from System 1. It is designed to jump to conclusions based on some evidence. System 2 is more logical and challenges that answer – is it fact-based? This takes time and energy, and sometimes it is not activated. Researchers have come to the conclusion that people who assume that their initial assessment is wrong, and then make another estimate in combination with the first, often improve their accuracy. The same effect applies if there are a few weeks between the first and second estimate. Beliefs are hypotheses to be tested, not treasures to be guarded.

FORTUNE FAVORS THE PREPARED. Randomized controlled trials have shown that mastering the contents of a small booklet can improve your accuracy by about 10%. It shows that "fortune favors the prepared mind". But there is much we do not know. There is a three-stage way to quickly separate what is "knowable" from what is "unknowable".

1. FERMI-IZE. Physicist Enrico Fermi conducted an experiment with his students, who did not have Google or other aids, to see if they could guess the number of piano tuners in Chicago. To break down the answer, we need to guess (1) the number of pianos in Chicago, (2) how often each piano is tuned per year, (3) how long it takes to tune a piano and (4) how many hours per year a piano tuner works.

2. BASE RATE – OUTSIDE IN. The base rate is how common something is within a broader class. To answer how likely a family is to have a pet, one must start with how likely an American household is to have a pet.

3. DIRECTED AND PURPOSEFUL INVESTIGATION. The SFs often use several different analytical tools and seek information from several sources, later synthesizing it into a single conclusion.

PERPETUAL UPDATE. A forecast that is updated to reflect the latest available information is likely to be closer to the truth than a forecast that is less based on current data. Bayes' theorem states that your new belief should depend on your previous beliefs (and all the knowledge that informed them) multiplied by the "diagnostic value" of the new information.

SUPERFORECASTERS IN A NUTSHELL. A superforecaster is cautious (nothing is certain), humble (reality is infinitely complex) and non-deterministic (there are several potential outcomes). In their abilities and ways of thinking, they tend to be open-minded (perceptions are hypotheses to test), intelligent and knowledgeable (curious) and reflective (introspective and self-critical). They are comfortable with numbers, pragmatic in their methods and analytical. In addition, they are "dragonfly-eyed" (they value different views and synthesize them into their own), probabilistic (many layers of "maybe"), well-thought-out updaters (when facts change, they change their view) and good intuitive psychologists (they test thinking for biases).
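The PERPETUAL UPDATE rule above (new belief = prior belief × the diagnostic value of the new information) is easiest to state in odds form, where the diagnostic value is a likelihood ratio. A sketch; the numbers are invented, and the function assumes a prior strictly between 0 and 1:

```python
def bayes_update(prior, likelihood_ratio):
    """Posterior probability after evidence with a given diagnostic value.

    likelihood_ratio = P(evidence | event) / P(evidence | no event).
    A ratio of 1 is non-diagnostic: the belief does not move.
    Assumes 0 < prior < 1.
    """
    posterior_odds = (prior / (1 - prior)) * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Start at 40%; new evidence is three times likelier if the event is real.
p = bayes_update(0.40, 3.0)
print(round(p, 2))  # → 0.67
```

Repeating this as each piece of information arrives is the "perpetual update" habit: small, frequent, evidence-proportional revisions rather than dramatic swings.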

Euan Ross

Interesting that Dominic Cummings refers to this book with almost biblical adoration. Can certainly see how it gave him the political foresight he needed over his opponents/the commentariat to achieve his aim. Though he did evidently skip the book's lesson on humility and leadership, which prevented him from listening to expert advice early in the pandemic. Silly Dom 🥸